Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery

Authors

  • Cun Mu
  • Bo Huang
  • John Wright
  • Donald Goldfarb
Abstract

Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms of the unfoldings of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way tensor of side length n and Tucker rank r from Gaussian measurements requires Ω(r n^(K−1)) observations. In contrast, a certain (intractable) nonconvex formulation needs only O(r^K + nrK) observations. We introduce a very simple, new convex relaxation which partially bridges this gap. Our new formulation succeeds with O(r^⌊K/2⌋ n^⌈K/2⌉) observations. While these results pertain to Gaussian measurements, simulations strongly suggest that the new norm also outperforms the sum of nuclear norms for tensor completion from a random subset of entries. Our lower bound for the sum-of-nuclear-norms model follows from a new result on recovering signals with multiple sparse structures (e.g. simultaneously sparse and low-rank), which perhaps surprisingly demonstrates the significant suboptimality of the common recovery approach of minimizing the sum of individual sparsity-inducing norms (e.g. the ℓ1 norm, the nuclear norm). Our new formulation for low-rank tensor recovery, however, opens the possibility of reducing the sample complexity by exploiting several structures jointly.
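
To see the two penalties side by side, here is a minimal numpy sketch (not the authors' code; the size n, rank r, order K, and the choice of grouping the first K/2 modes in the square reshaping are illustrative assumptions). It builds a random tensor of low Tucker rank and evaluates the sum-of-nuclear-norms penalty against the new square penalty:

    import numpy as np

    def unfold(T, mode):
        # Mode-k unfolding: bring axis `mode` to the front, flatten the rest.
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def nuclear_norm(M):
        return np.linalg.svd(M, compute_uv=False).sum()

    # Random n x n x n x n tensor of Tucker rank (r, r, r, r): a small
    # random core multiplied by a random n x r factor along each mode.
    n, r, K = 10, 2, 4
    T = np.random.randn(r, r, r, r)
    for k in range(K):
        U = np.random.randn(n, r)
        T = np.moveaxis(np.tensordot(U, np.moveaxis(T, k, 0), axes=(1, 0)), 0, k)

    # Sum-of-nuclear-norms (SNN) penalty: one unfolding per mode.
    snn = sum(nuclear_norm(unfold(T, k)) for k in range(K))

    # "Square" penalty: a single balanced reshaping, grouping the first
    # K/2 modes as rows, giving an n^2 x n^2 matrix instead of n x n^3.
    square = nuclear_norm(T.reshape(n ** (K // 2), -1))

    print("SNN penalty:   ", snn)
    print("square penalty:", square)

Each mode-k unfolding is an n × n^(K−1) matrix of rank at most r, while the square reshaping is a balanced n^(K/2) × n^(K/2) matrix of rank at most r^(K/2); packing the low-rank structure into one balanced matrix is what drives the improved O(r^⌊K/2⌋ n^⌈K/2⌉) bound.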


Similar Papers

Square Deal: Lower Bounds and Improved Convex Relaxations for Tensor Recovery

Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms (SNN) of the unfolding matrices of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way n×n×⋯×n tensor of Tucker rank (r, r, …, r)...


On the Exponent of Triple Tensor Product of p-Groups

The non-abelian tensor product of groups, which has its origins in algebraic K-theory as well as in homotopy theory, was introduced by Brown and Loday in 1987. Group-theoretical aspects of non-abelian tensor products have been studied extensively. In particular, some studies focused on the relationship between the exponent of a group and the exponent of its tensor square. On the other hand, com...


Strong exponent bounds for the local Rankin-Selberg convolution

Let $F$ be a non-Archimedean locally compact field. Let $\sigma$ and $\tau$ be finite-dimensional representations of the Weil-Deligne group of $F$. We give strong upper and lower bounds for the Artin and Swan exponents of $\sigma\otimes\tau$ in terms of those of $\sigma$ and $\tau$. We give a different lower bound in terms of $\sigma\otimes\check\sigma$ and $\tau\otimes\check\tau$. Using the Langlands...


Tensor principal component analysis via sum-of-square proofs

We study a statistical model for the tensor principal component analysis problem introduced by Montanari and Richard: given an order-3 tensor T of the form T = τ · v₀^{⊗3} + A, where τ > 0 is a signal-to-noise ratio, v₀ is a unit vector, and A is a random noise tensor, the goal is to recover the planted vector v₀. For the case that A has iid standard Gaussian entries, we give an efficient algorith...
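
The spiked model described here is easy to simulate. Below is a hedged numpy sketch (parameter values are illustrative, and the recovery step is a simple spectral baseline on an unfolding, not the sum-of-squares algorithm the paper analyzes):

    import numpy as np

    n, tau = 50, 60.0  # illustrative size and signal-to-noise ratio
    v0 = np.random.randn(n)
    v0 /= np.linalg.norm(v0)

    # T = tau * v0^{(x)3} + A, with A an iid standard Gaussian noise tensor.
    T = tau * np.einsum("i,j,k->ijk", v0, v0, v0) + np.random.randn(n, n, n)

    # Baseline recovery: unfold T into an n x n^2 matrix and take its
    # leading left singular vector as the estimate of v0 (up to sign).
    v_hat = np.linalg.svd(T.reshape(n, -1), full_matrices=False)[0][:, 0]
    print("correlation |<v_hat, v0>|:", abs(v_hat @ v0))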


Tensor sparsification via a bound on the spectral norm of random tensors

Given an order-d tensor A ∈ ℝ^{n×n×⋯×n}, we present a simple, element-wise sparsification algorithm that zeroes out all sufficiently small elements of A, keeps all sufficiently large elements of A, and retains some of the remaining elements with probabilities proportional to the square of their magnitudes. We analyze the approximation accuracy of the proposed algorithm using a powerful inequalit...
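
The sparsification rule quoted above is concrete enough to sketch. The following is a minimal illustration under stated assumptions: the thresholds `small` and `large` and the scale `c` are hypothetical free parameters (the paper's actual choices are not reproduced in this excerpt), and the 1/p rescaling of kept entries, a standard device for keeping such estimators unbiased, is our addition:

    import numpy as np

    def sparsify(A, small, large, c):
        # Zero out entries below `small`, always keep entries at or above
        # `large`, and keep in-between entries with probability
        # proportional to their squared magnitude (clipped to 1).
        out = np.zeros_like(A)
        mag = np.abs(A)
        big = mag >= large
        out[big] = A[big]
        mid = (mag > small) & ~big
        p = np.clip(c * A[mid] ** 2, 0.0, 1.0)
        keep = np.random.rand(p.size) < p
        out[mid] = np.where(keep, A[mid] / p, 0.0)  # 1/p rescaling: our assumption
        return out

    A = np.random.randn(30, 30, 30)  # an order-3 example tensor
    S = sparsify(A, small=0.5, large=2.5, c=0.2)
    print("kept", np.count_nonzero(S), "of", A.size, "entries")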





Publication date: 2014